Crowd-sourcing evaluation of automatically acquired, morphologically related word groupings

Authors

  • Claudia Borg
  • Albert Gatt
Abstract

The automatic discovery and clustering of morphologically related words is an important problem with several practical applications. This paper describes the evaluation of word clusters for the Maltese language, carried out through crowd-sourcing techniques. The hybrid (Semitic-Romance) nature of Maltese morphology, together with the fact that no large-scale lexical resources are available for Maltese, makes this an interesting and challenging problem.
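
A minimal sketch may help make the evaluation setting concrete. In the snippet below, crowd workers judge whether pairs of words belong to the same morphological family, and the judgments are aggregated by majority vote with a simple percent-agreement score. The word pairs, vote counts, and aggregation scheme are illustrative assumptions, not the evaluation design used in the paper.

```python
# Illustrative sketch only: aggregating crowd judgments on whether word
# pairs belong to the same morphological family. The pairs, glosses, and
# the majority-vote scheme are assumptions for demonstration, not the
# paper's actual evaluation design.

# (word1, word2) -> yes/no judgments from individual crowd workers
judgments = {
    ("kiteb", "ktieb"): [True, True, True],    # 'he wrote' / 'book'
    ("kiteb", "kelb"):  [False, False, True],  # 'he wrote' / 'dog'
}

def majority(votes):
    """Majority label over a list of boolean judgments."""
    return sum(votes) > len(votes) / 2

def percent_agreement(votes):
    """Fraction of workers who agree with the majority label."""
    maj = majority(votes)
    return sum(v == maj for v in votes) / len(votes)

for pair, votes in judgments.items():
    label = "related" if majority(votes) else "unrelated"
    print(pair, label, f"agreement={percent_agreement(votes):.2f}")
```

In a fuller study one would also report a chance-corrected agreement statistic (e.g. Fleiss' kappa) over many workers and pairs, but majority vote plus raw agreement is enough to show the shape of the data being collected.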

Similar articles

Automatic Rating of User-Generated Math Solutions

Intelligent tutoring systems adapt to users’ cognitive factors, but typically not to affective or conative factors. Crowd-sourcing may be a way to create materials that engage a wide range of users along these differences. We build on our earlier work in crowd-sourcing worked example solutions and offer a data mining method for automatically rating the crowd-sourced examples to determine which ...

Annotating biomedical ontology terms in electronic health records using crowd-sourcing

Electronic health records have been adopted by many institutions and constitute an important source of biomedical information. Text mining methods can be applied to this type of information to automatically extract useful knowledge. We propose a crowd-sourcing pipeline to improve the precision of extraction and normalization of biomedical terms. Although crowd-sourcing has been applied in other...

For a fistful of dollars: using crowd-sourcing to evaluate a spoken language CALL application

We present an evaluation of a Web-deployed spoken language CALL system, carried out using crowd-sourcing methods. The system, “Survival Japanese”, is a crash course in tourist Japanese implemented within the platform CALL-SLT. The evaluation was carried out over one week using the Amazon Mechanical Turk. Although we found a high proportion of attempted scammers, there was a core of 23 subjects ...

Focus Annotation of Task-based Data: Establishing the Quality of Crowd Annotation

We explore the annotation of information structure in German and compare the quality of expert annotation with crowdsourced annotation taking into account the cost of reaching crowd consensus. Concretely, we discuss a crowd-sourcing effort annotating focus in a task-based corpus of German containing reading comprehension questions and answers. Against the backdrop of a gold standard reference r...

The Promise of a Crowd

This paper presents an evaluation of a mobile complaint and problem-reporting solution made for Swedish municipalities and their citizens. The evaluation is made through a government 2.0 framework to assess the appropriateness of the initiative as a citizen-sourcing solution. The research approach consists of a secondary analysis of empirical data. The researchers have been active participants ...

Publication date: 2014